14 research outputs found

    Towards a Decentralized Metaverse: Synchronized Orchestration of Digital Twins and Sub-Metaverses

    Full text link
    Accommodating digital twins (DTs) in the metaverse is essential to achieving digital reality. This need for integrating DTs into the metaverse while operating them at the network edge has increased the demand for a decentralized edge-enabled metaverse. Hence, to consolidate the fusion between real and digital entities, it is necessary to harmonize the interoperability between DTs and the metaverse at the edge. In this paper, a novel decentralized metaverse framework that incorporates DT operations at the wireless edge is presented. In particular, a system of autonomous physical twins (PTs) operating in a massively sensed zone is replicated as cyber twins (CTs) at the mobile edge computing (MEC) servers. To render the CTs' digital environment, this zone is partitioned and teleported as distributed sub-metaverses to the MEC servers. To guarantee seamless synchronization of the sub-metaverses and their associated CTs with the dynamics of the real world and PTs, respectively, this joint synchronization problem is posed as an optimization problem whose goal is to minimize the average sub-synchronization time between the real and digital worlds, while meeting the DT synchronization intensity requirements. To solve this problem, a novel iterative algorithm for joint sub-metaverse and DT association at the MEC servers is proposed. This algorithm exploits the rigorous framework of optimal transport theory so as to efficiently distribute the sub-metaverses and DTs, while considering the computing and communication resource allocations. Simulation results show that the proposed solution can orchestrate the interplay between DTs and sub-metaverses to achieve a 25.75% reduction in the sub-synchronization time in comparison to the signal-to-noise-ratio-based association scheme.
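
    As a rough illustration of the optimal-transport machinery referenced above (and not the paper's actual iterative algorithm), the sketch below uses entropic-regularized Sinkhorn iterations to associate digital twins with MEC servers given an assumed matrix of per-pair sub-synchronization times; the names sync_cost, dt_demand, and mec_capacity and all numbers are hypothetical.

        # Illustrative sketch (not the paper's algorithm): entropic optimal transport
        # via Sinkhorn iterations to associate digital twins (DTs) with MEC servers,
        # given an assumed matrix of per-pair sub-synchronization times.
        import numpy as np

        def sinkhorn_association(sync_cost, dt_demand, mec_capacity, reg=1.0, n_iter=500):
            """Return a soft DT-to-MEC transport plan for the regularized OT problem.

            sync_cost    : (num_dt, num_mec) assumed sub-synchronization time per pair
            dt_demand    : (num_dt,) synchronization intensity of each DT (sums to 1)
            mec_capacity : (num_mec,) normalized capacity of each MEC server (sums to 1)
            """
            K = np.exp(-sync_cost / reg)            # Gibbs kernel of the cost matrix
            u = np.ones(len(dt_demand))
            for _ in range(n_iter):                 # alternating scaling (Sinkhorn-Knopp)
                v = mec_capacity / (K.T @ u)
                u = dt_demand / (K @ v)
            return u[:, None] * K * v[None, :]      # transport plan: association weights

        rng = np.random.default_rng(0)
        cost = rng.uniform(1.0, 10.0, size=(6, 3))  # hypothetical sync times (ms)
        plan = sinkhorn_association(cost, np.full(6, 1/6), np.full(3, 1/3))
        print(plan.argmax(axis=1))                  # hard DT-to-MEC association per DT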

    Can Terahertz Provide High-Rate Reliable Low Latency Communications for Wireless VR?

    Full text link
    Wireless virtual reality (VR) imposes new visual and haptic requirements that are directly linked to the quality-of-experience (QoE) of VR users. These QoE requirements can only be met by wireless connectivity that offers high-rate and high-reliability low latency communications (HRLLC), unlike the low rates usually considered in vanilla ultra-reliable low latency communication scenarios. The high rates for VR over short distances can only be supported by an enormous bandwidth, which is available in terahertz (THz) frequency bands. Guaranteeing HRLLC requires dealing with the uncertainty that is specific to the THz channel. To explore the potential of THz for meeting HRLLC requirements, a quantification of the risk for an unreliable VR performance is conducted through a novel and rigorous characterization of the tail of the end-to-end (E2E) delay. Then, a thorough analysis of the tail-value-at-risk (TVaR) is performed to concretely characterize the behavior of extreme wireless events crucial to the real-time VR experience. System reliability for scenarios with guaranteed line-of-sight (LoS) is then derived as a function of THz network parameters after deriving a novel expression for the probability distribution function of the THz transmission delay. Numerical results show that abundant bandwidth and low molecular absorption are necessary to improve the reliability. However, their effect remains secondary compared to the availability of LoS, which significantly affects the THz HRLLC performance. In particular, for scenarios with guaranteed LoS, a reliability of 99.999% (with an E2E delay threshold of 20 ms) for a bandwidth of 15 GHz along with data rates of 18.3 Gbps can be achieved by the THz network (operating at a frequency of 1 THz), compared to a reliability of 96% for twice the bandwidth, when blockages are considered. Comment: arXiv admin note: text overlap with arXiv:1905.0765
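
    For readers unfamiliar with the risk measure used above, the sketch below estimates value-at-risk and tail-value-at-risk of an end-to-end delay directly from samples; the lognormal delay model is only an assumed stand-in for the paper's derived THz transmission-delay distribution.

        # Illustrative sketch: empirical VaR/TVaR of an end-to-end (E2E) delay from samples.
        # The lognormal delay model below is an assumed stand-in, not the paper's
        # derived THz transmission-delay distribution.
        import numpy as np

        def var_tvar(delays_ms, alpha=0.999):
            """Value-at-risk and tail-value-at-risk of the delay at confidence alpha."""
            var = np.quantile(delays_ms, alpha)      # delay exceeded with prob. 1 - alpha
            tvar = delays_ms[delays_ms >= var].mean()  # mean delay within the worst tail
            return var, tvar

        rng = np.random.default_rng(1)
        delays = rng.lognormal(mean=1.5, sigma=0.4, size=1_000_000)  # assumed E2E delays (ms)
        var, tvar = var_tvar(delays, alpha=0.999)
        reliability = (delays <= 20.0).mean()        # fraction meeting a 20 ms threshold
        print(f"VaR = {var:.2f} ms, TVaR = {tvar:.2f} ms, reliability = {reliability:.5f}")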

    Causal Reasoning: Charting a Revolutionary Course for Next-Generation AI-Native Wireless Networks

    Full text link
    Despite the basic premise that next-generation wireless networks (e.g., 6G) will be artificial intelligence (AI)-native, to date, most existing efforts remain either qualitative or incremental extensions to existing "AI for wireless" paradigms. Indeed, creating AI-native wireless networks faces significant technical challenges due to the limitations of data-driven, training-intensive AI. These limitations include the black-box nature of the AI models, their curve-fitting nature, which can limit their ability to reason and adapt, their reliance on large amounts of training data, and the energy inefficiency of large neural networks. In response to these limitations, this article presents a comprehensive, forward-looking vision that addresses these shortcomings by introducing a novel framework for building AI-native wireless networks grounded in the emerging field of causal reasoning. Causal reasoning, founded on causal discovery, causal representation learning, and causal inference, can help build explainable, reasoning-aware, and sustainable wireless networks. Towards fulfilling this vision, we first highlight several wireless networking challenges that can be addressed by causal discovery and representation, including ultra-reliable beamforming for terahertz (THz) systems, near-accurate physical twin modeling for digital twins, training data augmentation, and semantic communication. We showcase how incorporating causal discovery can assist in achieving dynamic adaptability, resilience, and cognition in addressing these challenges. Furthermore, we outline potential frameworks that leverage causal inference to achieve the overarching objectives of future-generation networks, including intent management, dynamic adaptability, human-level cognition, reasoning, and the critical element of time sensitivity.
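
    As a generic toy example of the distinction the article draws between curve fitting and causal reasoning, the sketch below builds a small structural causal model in which a confounder (blockage) biases the observational relation between beam width and SNR, while an intervention (do-operator) recovers the true causal effect; the model and all of its numbers are invented purely for illustration and are not taken from the article.

        # Toy structural causal model (SCM), invented for illustration only, contrasting
        # an observational "curve fit" with the interventional (do-operator) reasoning
        # that causal-inference-based networks rely on.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000
        blockage = rng.binomial(1, 0.3, n)                        # hidden confounder
        beam_width = 0.5 + 0.8 * blockage + 0.1 * rng.standard_normal(n)
        snr_db = 20 - 6 * beam_width - 10 * blockage + rng.standard_normal(n)

        # Observational estimate: regress SNR on beam width (confounded by blockage).
        slope_obs = np.polyfit(beam_width, snr_db, 1)[0]

        # Interventional estimate: fix beam width via do(), keeping the SCM mechanisms.
        def do_beam_width(value):
            b = rng.binomial(1, 0.3, n)
            return (20 - 6 * value - 10 * b + rng.standard_normal(n)).mean()

        slope_do = do_beam_width(1.5) - do_beam_width(0.5)        # per-unit causal effect
        print(f"observational slope ~ {slope_obs:.2f} dB, causal effect ~ {slope_do:.2f} dB")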

    Seven Defining Features of Terahertz (THz) Wireless Systems: A Fellowship of Communication and Sensing

    Full text link
    Wireless communication at the terahertz (THz) frequency bands (0.1-10 THz) is viewed as one of the cornerstones of tomorrow's 6G wireless systems. Owing to the large amount of available bandwidth, THz frequencies can potentially provide wireless capacity performance gains and enable high-resolution sensing. However, operating a wireless system at the THz band is limited by a highly uncertain channel. Effectively, these channel limitations lead to unreliable intermittent links as a result of a short communication range and a high susceptibility to blockage and molecular absorption. Consequently, such impediments could disrupt the THz band's promise of high-rate communications and high-resolution sensing capabilities. In this context, this paper panoramically examines the steps needed to efficiently deploy and operate next-generation THz wireless systems that will synergistically support a fellowship of communication and sensing services. For this purpose, we first set the stage by describing the fundamentals of the THz frequency band. Based on these fundamentals, we characterize seven unique defining features of THz wireless systems: 1) Quasi-opticality of the band, 2) THz-tailored wireless architectures, 3) Synergy with lower frequency bands, 4) Joint sensing and communication systems, 5) PHY-layer procedures, 6) Spectrum access techniques, and 7) Real-time network optimization. These seven defining features allow us to shed light on how to re-engineer wireless systems as we know them today so as to make them ready to support THz bands. Furthermore, these features highlight how THz systems turn every communication challenge into a sensing opportunity. Ultimately, the goal of this article is to chart a forward-looking roadmap that exposes the necessary solutions and milestones for enabling THz frequencies to realize their potential as a game changer for next-generation wireless systems. Comment: 26 pages, 6 figures
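
    As a back-of-the-envelope illustration of why the THz band is both capacity-rich and range-limited, the sketch below evaluates the standard spreading-plus-molecular-absorption path-loss model; the absorption coefficient is an assumed placeholder rather than a value taken from the paper.

        # Back-of-the-envelope THz link-budget sketch: free-space spreading loss plus
        # molecular-absorption loss, PL(f, d) = (4*pi*f*d/c)^2 * exp(k(f)*d).
        # The absorption coefficient k is frequency- and humidity-dependent; the value
        # below is an assumed placeholder.
        import math

        def thz_path_loss_db(freq_hz, dist_m, k_abs=0.05):
            """Total path loss in dB at frequency freq_hz over dist_m meters."""
            c = 3e8
            spreading_db = 20 * math.log10(4 * math.pi * freq_hz * dist_m / c)
            absorption_db = 10 * k_abs * dist_m * math.log10(math.e)   # exp(k*d) in dB
            return spreading_db + absorption_db

        for d in (1, 5, 10):               # short indoor distances (m)
            print(f"{d:>2} m @ 1 THz: {thz_path_loss_db(1e12, d):.1f} dB")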

    Joint Location, Bandwidth and Power Optimization for THz-enabled UAV Communications

    Full text link
    In this paper, the problem of unmanned aerial vehicle (UAV) deployment, power allocation, and bandwidth allocation is investigated for a UAV-assisted wireless system operating at terahertz (THz) frequencies. In the studied model, one UAV can service ground users using the THz frequency band. However, the highly uncertain THz channel will introduce new challenges to the UAV location, user power, and bandwidth allocation optimization problems. Therefore, it is necessary to design a novel framework to deploy UAVs in THz wireless systems. This problem is formally posed as an optimization problem whose goal is to minimize the total delays of the uplink and downlink transmissions between the UAV and the ground users by jointly optimizing the deployment of the UAV, the transmit power, and the bandwidth of each user. The communication delay is crucial for emergency communications. To tackle this nonconvex delay minimization problem, an alternating algorithm is proposed that iteratively solves three subproblems: a location optimization subproblem, a power control subproblem, and a bandwidth allocation subproblem. Simulation results show that the proposed algorithm can reduce the transmission delay by up to 59.3%, 49.8%, and 75.5%, respectively, compared to baseline algorithms that optimize only the UAV location, bandwidth allocation, or transmit power control. Comment: 5 pages, IEEE Communications Letters
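
    The sketch below is a toy version of the alternating structure described above, not the letter's exact algorithm: it cycles over the UAV position, per-user bandwidth shares, and per-user transmit powers under a simplified THz rate model; the channel-gain constants and the heuristic bandwidth/power splits are assumptions made for illustration.

        # Toy alternating-optimization sketch (not the letter's algorithm): iterate over
        # (i) UAV position, (ii) bandwidth shares, (iii) transmit powers to reduce the
        # sum of transmission delays. The rate model B*log2(1 + p*g/(N0*B)) with a
        # distance- and absorption-based gain is a deliberate simplification.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        users = rng.uniform(-50, 50, size=(4, 2))      # ground-user positions (m)
        data_bits = np.full(4, 1e8)                    # payload per user (bits)
        B_total, P_total, N0, k_abs = 10e9, 4.0, 4e-21, 0.05   # assumed system constants

        def gain(uav_xy, h=30.0):
            d = np.sqrt(((users - uav_xy) ** 2).sum(axis=1) + h**2)
            return 1e-4 * np.exp(-k_abs * d) / d**2    # simplified THz channel gain

        def total_delay(uav_xy, bw, pw):
            rate = bw * np.log2(1 + pw * gain(uav_xy) / (N0 * bw))
            return (data_bits / rate).sum()

        uav = np.zeros(2)
        bw = np.full(4, B_total / 4)
        pw = np.full(4, P_total / 4)
        for _ in range(10):                            # alternating minimization
            uav = minimize(lambda x: total_delay(x, bw, pw), uav, method="Nelder-Mead").x
            w = np.sqrt(data_bits / gain(uav))         # heuristic bandwidth split
            bw = B_total * w / w.sum()
            w = data_bits / gain(uav)                  # heuristic power split
            pw = P_total * w / w.sum()
            print(f"total delay: {total_delay(uav, bw, pw):.4f} s")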

    Risk-Based Optimization of Virtual Reality over Terahertz Reconfigurable Intelligent Surfaces

    No full text

    Lifelong Learning for Minimizing Age of Information in Internet of Things Networks

    No full text
    In this paper, a lifelong learning problem is studied for an Internet of Things (IoT) system. In the considered model, each IoT device aims to balance its information freshness and energy consumption tradeoff by controlling its computational resource allocation at each time slot under dynamic environments. An unmanned aerial vehicle (UAV) is deployed as a flying base station so as to enable the IoT devices to adapt to novel environments. To this end, a new lifelong reinforcement learning algorithm, used by the UAV, is proposed in order to adapt the operation of the devices at each visit by the UAV. By using the experience from previously visited devices and environments, the UAV can help devices adapt faster to future states of their environment. To do so, a knowledge base shared by all devices is maintained at the UAV. Simulation results show that the proposed algorithm can converge 25% to 50% faster than a policy gradient baseline algorithm that optimizes each device's decision-making problem in isolation.
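
    The sketch below illustrates the warm-start idea behind a UAV-maintained knowledge base using plain tabular Q-learning: Q-tables learned at previously visited devices seed the learner at the next device. The environment dynamics and all constants are invented for illustration and do not reproduce the paper's lifelong reinforcement learning algorithm.

        # Toy warm-start sketch: the UAV stores Q-tables learned at previously visited
        # devices and uses their average to initialize learning at a new device.
        # Tabular Q-learning stand-in, not the paper's lifelong RL algorithm.
        import numpy as np

        N_STATES, N_ACTIONS = 8, 3      # e.g., AoI/energy levels x CPU-frequency choices

        def q_learning(env_step, q_init, episodes=200, eps=0.1, lr=0.2, gamma=0.9):
            q = q_init.copy()
            for _ in range(episodes):
                s = 0
                for _ in range(50):     # eps-greedy rollout with one-step Q updates
                    a = np.random.randint(N_ACTIONS) if np.random.rand() < eps else q[s].argmax()
                    s_next, r = env_step(s, a)
                    q[s, a] += lr * (r + gamma * q[s_next].max() - q[s, a])
                    s = s_next
            return q

        def make_env(seed):             # hypothetical per-device AoI/energy dynamics
            rng = np.random.default_rng(seed)
            rewards = rng.normal(size=(N_STATES, N_ACTIONS))
            def step(s, a):
                return (s + a + 1) % N_STATES, rewards[s, a]
            return step

        knowledge_base = []             # maintained at the UAV across device visits
        for device in range(5):
            prior = np.mean(knowledge_base, axis=0) if knowledge_base else np.zeros((N_STATES, N_ACTIONS))
            q = q_learning(make_env(device), q_init=prior)   # warm-start from the KB
            knowledge_base.append(q)
            print(f"device {device}: mean Q after training = {q.mean():.2f}")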